
tv   CNN Newsroom With Poppy Harlow and Jim Sciutto  CNN  October 5, 2021 7:00am-8:00am PDT

7:00 am
good morning. >> welcome to our viewers here in the u.s. and around the world. breaking news right now: critical pushback against the powers of social media today. in a matter of minutes, a facebook whistleblower will testify before a senate panel. she claims that facebook is prioritizing profit over public good by knowingly peddling disinformation, in fact, hiding the results of its own internal studies. we're going to go live to the hearing once it begins.
7:01 am
>> in prepared testimony, the whistleblower plans to say, quote, i believe what i did was right and necessary for the common good. but i know facebook has infinite resources which it could use to destroy me. as long as facebook is operating in the dark, it is accountable to no one and will continue to make choices that go against the common good. let's continue with our correspondents. good to have you with us. as we are waiting, we got that opening statement. she also talks about how what she saw at facebook was so different from her experience at other social media companies. that stands out, too. >> yeah, i think we got a reminder of that. today is all about power. the power of facebook. we saw how powerful it is with the outage. one company controls three of the world's biggest platforms.
7:02 am
when one goes down, oftentimes they'll all go down, as we saw yesterday. but you're right, she said she has experience on other platforms and what she saw at facebook concerned her so much that she's now shortly going to be giving testimony in the room behind me. >> i'm going to quote from her opening statement. facebook, she says, became a $1 trillion company by paying for its profits with its safety, including the safety of our children, and that's unacceptable. this is not the first bout of bad pr for facebook here and yet the profit spigot keeps on spitting out money. and the disinformation continues to get shared. do you see this as a moment of change? >> every time there has been a facebook scandal, think about
7:03 am
cambridge analytica, they try to lay low, and then users come back. most users come back, although they have lost in some ways around the edges, around younger users who might like instagram but not facebook. but since instagram is owned by facebook, mark zuckerberg always wins. protecting kids online. this is about a vulnerable population that gets addicted to these products at a very early age. any parent knows that to be true, and we're only just beginning as a society to reckon with the consequences. so that is why her testimony, her decision to come forward, by the way, one of tens of thousands of employees, and she decided she had to quit and speak this truth. she could make a difference because it's about kids. >> we had that father on who placed some blame on instagram for his daughter's suicide.
7:04 am
>> he talked about how his daughter didn't use social media a lot, but after her death, they recognized some of the places she was visiting and where it was pointing her, right? and when you talk about maybe kids aren't using facebook as much, they are going to instagram. that's the concern. we both have teenagers and we see this. the concern is what it is being engineered to do and what it is feeding these kids in terms of body image issues. showing them that suicide is an answer, as we heard from that father. >> when i interviewed a facebook executive over the weekend, he said this has always been a challenge for teenagers, for young women. fashion magazines, for instance. but they're not as influential.
7:05 am
she is not the only one inside facebook with these fears. >> people do not check fashion magazines 27 times a day. this is the chairman, richard blumenthal. >> thank you, ranking member, senator blackburn, for your cooperation and collaboration. we've been working very closely. and the ranking member who is here, senator wicker, as well as our chairwoman, maria cantwell. senator cantwell i'm sure will be here shortly. most important, i'd like to thank our witness for being here and the two counsel who are representing her today and i want to give you my heartfelt gratitude for your courage and strength in coming forward as you have done, standing up to one of the most powerful corporate giants in the history of the
7:06 am
world without any exaggeration. you have a compelling, credible voice, which we've heard already, but you are not here alone. you're armed with documents and evidence. and you speak volumes, as they do, about how facebook has put profits ahead of people. among other revelations, the information you have provided to congress is powerful proof that facebook knew its products were harming teenagers. facebook exploited teens using powerful algorithms that amplified their insecurities and abuses through what it found was an addict's narrative. there is a question, which i hope you will discuss, as to whether there is such a thing as
7:07 am
a safe algorithm. facebook saw teens creating secret accounts that are often hidden from their parents as a unique value proposition. in their words, a unique value proposition. a way to drive up numbers for advertisers and shareholders at the expense of safety. and it doubled down on targeting children, pushing products on preteens, not just teens, but preteens, that it knows are harmful to our kids' mental health and well-being. instead of telling parents, facebook concealed the facts. it sought to stonewall and block this information from becoming public, including to this committee when senator blackburn and i specifically asked the company. and still, even now, as of just last thursday when a facebook
7:08 am
witness came before this committee, it has refused disclosure, or even to say when it might decide to disclose additional documents. they've continued their tactics even after they knew the destruction they caused. they continued to profit from them. their profit was more important than the pain that they caused. last thursday, the message from miss antigone davis was simple. quote, this research is not a bombshell, end quote. and she repeated the line. not a bombshell. well, this research is the very definition of a bombshell. facebook and big tech are facing a big tobacco moment. a moment of reckoning.
7:09 am
the parallel is striking. i sued big tobacco as connecticut's attorney general. i helped to lead the states in that and i remember very, very well the moment in the course of our litigation when we learned of those files that showed not only that big tobacco knew that its product caused cancer, but that they had done the research, they concealed the files, and now we knew and the world knew. and big tech now faces that big tobacco, jaw-dropping moment of truth. it is documented proof that facebook knows its products can be addictive and toxic to children. and it's not just that they made money. again, it's that they valued
7:10 am
their profit more than the pain that they caused to children and their families. the damage to self-interest and self-worth inflicted by facebook today will haunt a generation. feelings of inadequacy, insecurity, self-hatred will impact this generation for years to come. our children are the ones who are victims. teens today looking at themselves in the mirror feel doubt and insecurity. mark zuckerberg ought to be looking at himself in the mirror today and yet rather than taking responsibility and showing leadership, mr. zuckerberg is going sailing. his new modus operandi: no
7:11 am
action, nothing to see here. mark zuckerberg, you need to come before this committee. you need to explain to frances, to us, to the world, and to the parents of america what you were doing and why you did it. instagram's business model is pretty straightforward. more eyeballs, more dollars. everything facebook does is to add more users and keep them on their apps for longer. in order to hook us, instagram uses our private information to precisely target us with content and recommendations, assessing what will provoke a reaction and keep us scrolling. far too often, these recommendations encourage destructive behaviors. as we showed on thursday, we
7:12 am
created a fake account. my office and i did, as a teen interested in extreme dieting and eating disorders. instagram latched on to that teenager's initial insecurities and then pushed more content and recommendations glorifying eating disorders. that's how instagram's algorithms can push teens into darker and darker places. facebook's own researchers called it instagram's, quote, perfect storm, exacerbating downward spirals. facebook, as you have put it so powerfully, maximizes profits and ignores pain. facebook's failure to acknowledge and to act makes it morally bankrupt. again and again, facebook rejected reforms recommended by its own researchers. last week, miss davis said,
7:13 am
quote, we're looking at it, end quote. no specific plans, no commitments, only vague platitudes. these documents that you have revealed provided this company with a blueprint. they provide specific recommendations that could have made facebook and instagram safer, but the company repeatedly ignored those recommendations from its own researchers. facebook researchers have suggested changing their recommendations to stop promoting accounts known to encourage dangerous body comparison. instead of making meaningful changes, facebook simply pays lip service. and if they won't act, and if big tech won't act,
7:14 am
congress has to intervene. privacy protection is long overdue. senator markey and i have introduced the kids act, which would ban addictive tactics that facebook uses to exploit children. parents deserve better tools to protect their children. i'm also a firm supporter of reforming section 230. we should consider narrowing this sweeping immunity when platforms' algorithms amplify illegal conduct. perhaps you'll expand on it. we've heard compelling recommendations about requiring disclosures of research and independent reviews of these platforms' algorithms and i plan to pursue these ideas. the securities and exchange commission should investigate your contentions and claims. and so should the federal trade
7:15 am
commission. facebook appears to have misled the public and investors, and if that's correct, it ought to face real penalties as a result of that misleading and deceptive misrepresentation. i want to thank all my colleagues who are here today because what we have is a bipartisan congressional roadmap for reform that will safeguard and protect children from big tech. that will be a focus of our subcommittee moving forward and it will continue to be bipartisan. and finally, i'll just end on this note. in the past weeks and days, parents have contacted me with their heartbreaking and spine-chilling stories about children pushed into eating disorders, bullying online, self-injury of
7:16 am
the most disturbing kind and sometimes even taking their lives because of social media. parents are holding facebook accountable because of your bravery. and we need to hold accountable facebook and all big tech as well. again, my thanks to you. i am going to enter into the record a letter from 52 state attorneys general and from two members of the youth advisory board of sandy hook promise, so long as there's no objection. and now i'll turn to the ranking member. >> thank you, mr. chairman, and thank you for entering that letter in the record that we have from our state attorneys general. good morning to everyone. it is nice to see people in this hearing room and to be here for the hearing today.
7:17 am
we thank you for your appearance before us today and for giving the opportunity not only for congress, but for the american people to hear from you in this setting, and we appreciate that. mr. chairman, my thanks also to you and your staff for making certain that we have this hearing and this opportunity today so that we can get more insight into what facebook is actually doing as they invade the privacy, not only of adults, but of children, and look at the ways that they are in violation of the children's online privacy protection act, which is federal law, and looking at how they are evading that law and working around it. and as the chairman said, privacy and online privacy,
7:18 am
passing a federal privacy standard has been long in the works. i filed my first privacy bill when i was in the house back in 2012 and i think it will be this congress and this subcommittee that is going to lead the way on data security, section 230 reforms and of course senator klobuchar always wants to talk about antitrust. and i have to give a nod: senator markey is down there. when we were in the house, we were probably two of the only ones who were talking about the need to have a federal privacy standard. now as the chairman mentioned, last week, we heard from miss davis, who heads global safety for facebook. it was surprising to us that she
7:19 am
tried to minimize the information in these documents. to minimize the research. and to minimize the knowledge that facebook had. at one point, i even reminded her the research was not third party research. the research was their own, facebook's internal research. so they knew what they were doing. they knew where the violations were and they know they are guilty. their research tells them this. last week in advance of our hearing, facebook released two studies and said that "the wall street journal" was all wrong. they had just gotten it wrong. as if "the wall street journal" did not know how to read these documents and how to work through this research.
7:20 am
having seen the data that you've presented and the other studies that facebook did not publicly share, i feel pretty confident that it's facebook who has done the misrepresenting to this committee. here are some of the numbers that facebook chose not to share, and mr. chairman, i think it's important that we look at these as we talk about the setting for this hearing. what we learned last week, what you and i have been learning over the past three years about big tech and facebook. and here you go. 66% of teen girls on instagram and 40% of teen boys experience negative social comparisons. this is facebook's research. 52% of teen girls who experience negative social comparison on instagram said it was caused by
7:21 am
images related to beauty. social comparison is worse on instagram because it is perceived as real life, but based on celebrity standards. social comparison mimics the grief cycle and includes a downward emotional spiral encompassing a range of emotions from jealousy to self-proclaimed body dysmorphia. what facebook calls problematic use is most severe in teens, peaking at age 14. facebook is not interested in making significant changes to improve kids' safety on their platforms, at least not when that would result in losing eyeballs on posts or decreasing their ad revenues.
7:22 am
in fact, facebook is running scared, as they know that, in their own words, young adults are less active and less engaged on facebook and that they are running out of teens to add to instagram. so teens are looking at other platforms like tiktok and facebook is only making those changes that add to its user numbers and ultimately its profits. follow the money. so what are these changes? allowing users to create multiple accounts that facebook does not delete and encouraging teens to create second accounts they can hide from their parents. they are also studying younger and younger children, as young as 8, so that they can market to them. and while miss davis says that kids below 13 are not allowed on facebook or instagram, we know that they are because she told
7:23 am
us that they recently had deleted 600,000 accounts from children under age 13. so how do you get that many underage accounts if you aren't turning a blind eye to them in the first place? and then, in order to try to clean it up, you go and delete them and say, oh, by the way, we just, in the last month, deleted 600,000 underage accounts. and speaking of turning a blind eye, facebook turns a blind eye to user privacy. news broke yesterday that the private data of over 1.5 billion, that's right, 1.5 billion facebook users is being sold on a hacking forum. that's its biggest data breach to date. examples like this underscore my
7:24 am
strong concerns about facebook collecting the data of kids and teens and what they are doing with it. facebook also turns a blind eye toward blatant human exploitation taking place on its platform: trafficking, forced labor, cartels, the worst possible things one can imagine. big tech companies have gotten away with abusing consumers for too long. it is clear that facebook prioritizes profit over the well-being of children and all users. so as a mother and a grandmother, this is an issue that is of particular concern to me. so we thank you for being here today, miss haugen, and we look forward to getting to the truth about what facebook is doing with users' data and how they are abusing their privacy
7:25 am
and how they show a lack of respect for the individuals that are on their network. we look forward to the testimony. thank you, mr. chairman. >> thanks, senator blackburn. i don't know whether the ranking member would like to make a -- >> if you don't mind. thank you, chairman blumenthal, and i will just take a moment or two. and i do appreciate being able to speak as ranking member of the full committee. this, miss haugen, this is a subcommittee hearing. you see some vacant seats. this is pretty good attendance for a subcommittee. there are also a lot of things going on, so people will be coming and going, but i'm willing to predict this will have almost 100% attendance by members of the subcommittee because of the importance of this subject matter. so thanks for coming forward to share concerns about facebook's business practices, particularly with respect to children and teens, and of course
7:26 am
that is the main topic, the title of our hearing today: protecting kids online. the recent revelations about facebook's effects on children and its plan to target younger audiences are indeed disturbing, and i think you're going to see a lot of bipartisan concern about this today and in future hearings. they show how urgent it is for congress to act against powerful tech companies on behalf of children and the broader public. and when i say powerful tech companies, they are possessive of immense, immense power. their product is addictive and people on both sides of this dais are concerned about this.
7:27 am
i talked to an opinion maker just down the hall a few moments before the hearing. this person said the tech gods have been demystified now. and i think this hearing today, mr. chairman, is part of the process of demystifying big tech. the children of america are hooked on their product. it is often destructive and harmful and there's a cynical knowledge on behalf of the leadership of these big tech companies that that is true. miss haugen, i hope you will have a chance to talk about your work experience at facebook and perhaps compare it to other social media companies. i also look forward to hearing your thoughts on how this committee and how this congress can ensure greater accountability and transparency,
7:28 am
especially with regard to children. so thank you, mr. chairman, and thank you, miss haugen, for being here today. >> our witness this morning is frances haugen. she was the lead product manager on the civic misinformation team at facebook. she holds a degree from olin college and an mba from harvard. she made the courageous decision, as all of us here and many others around the world know, to leave facebook and reveal the terrible truths about the company she learned during her tenure there. i think we are all in agreement here in expressing our gratitude and our admiration for your bravery in coming forward. thank you, miss haugen. please proceed.
7:29 am
>> good afternoon, chairman blumenthal, ranking member blackburn and members of the subcommittee. thank you for the opportunity to appear before you. my name is frances haugen. i used to work at facebook. i joined facebook because i think facebook has the potential to bring out the best in us, but i'm here today because i believe facebook's products harm children, stoke division, and weaken our democracy. the company's leadership knows how to make facebook and instagram safer, but won't make the necessary changes because they have put their astronomical profits before people. congressional action is needed. they won't solve this crisis without your help. yesterday we saw facebook get taken off the internet. i don't know why it went down, but i know that for more than five hours, it wasn't used to deepen divides and make young girls and women
7:30 am
feel bad about their bodies. it also means that millions of small businesses weren't able to reach potential customers and countless photos of new babies weren't joyously celebrated around the world. i believe in the potential of facebook. we can have social media we enjoy that connects us without tearing apart our democracy, putting our children in danger and sowing ethnic violence around the world. we can do better. i have worked as a product manager at large tech companies since 2006, including google, pinterest and facebook. my work has focused on recommendation systems like the one that powers the facebook news feed. having worked on four different types of social networks, i understand how complex and nuanced these problems are. however, the choices being made inside of facebook are disastrous. for our children, our public
7:31 am
safety, for our privacy, and for our democracy, and that is why we must demand facebook make changes. during my time at facebook, first working as the lead product manager for civic misinformation and later on counterespionage, i saw facebook repeatedly encounter conflicts between its own profits and our safety. facebook consistently resolved these conflicts in favor of its own profits. the result has been more division, more harm, more lies, more threats, and more combat. in some cases, this dangerous online talk has led to actual violence that harms and even kills people. this is not simply a matter of certain social media users being angry or unstable or about one side being radicalized against the other. it is about facebook choosing to grow at all costs, becoming an almost trillion dollar company by buying its profits with our safety.
7:32 am
during my time at facebook, i came to realize the devastating truth. almost no one outside of facebook knows what happens inside of facebook. the company intentionally hides vital information from the public, from the u.s. government, and from governments around the world. the documents i have provided to congress prove that facebook has repeatedly misled the public about what its own research reveals about the safety of children, the efficacy of its artificial intelligence systems and its role in spreading divisive and extreme messages. i came forward because i believe that every human being deserves the dignity of truth. the severity of this crisis demands that we break out of our previous regulatory frames. facebook wants to trick you into thinking that privacy protections alone will be sufficient. while important, these will not get to the core of the issue, which is that no one truly understands the destructive choices made by facebook except
7:33 am
facebook. we can afford nothing less than full transparency. as long as facebook is operating in the shadows, hiding its research from public scrutiny, it is unaccountable. until the incentives change, facebook will not change. left alone, facebook will continue to make choices that go against the common good. our common good. when we realized big tobacco was hiding the harms it caused, the government took action. when we figured out cars were safer with seat belts, the government took action. and when we learned opioids were taking lives, the government took action. today, facebook shapes our perception of the world by choosing the information we see. even those who don't use facebook are impacted by the majority who do. a company with such frightening influence over people, their
7:34 am
thoughts and behavior, needs real oversight. but facebook's closed design means it has no real oversight. only facebook knows how it personalizes your feed for you. at other large tech companies like google, any independent researcher can download the search results from the internet and write papers about what they find, but facebook hides behind walls that keep researchers and regulators from understanding the true dynamics of their system. facebook will tell you privacy means they can't give you data. this is not true. when tobacco companies claimed that filtered cigarettes were safer for consumers, scientists could independently invalidate these marketing messages and confirm that, in fact, they posed a greater threat to human health. the public cannot do the same with facebook. we are given no other option than to take their marketing messages on blind faith. not only does the company hide
7:35 am
most of its own data, my disclosure has proved that when facebook is directly asked questions as important as how do you impact the health and safety of our children, they choose to mislead and misdirect. facebook has not earned our blind faith. this inability to see into facebook's actual systems and confirm how they work as communicated is like the department of transportation regulating cars by only watching them drive down the highway. today, no regulator has a menu of solutions for how to fix facebook because facebook didn't want them to know about what's causing the problems. otherwise there wouldn't have been a need for a whistleblower. how is the public supposed to assess if facebook is resolving conflicts of interest in a way that is aligned with the public
7:36 am
good if the public has no visibility into how facebook operates? this must change. facebook wants you to believe that the problems we're talking about are unsolvable. they want you to believe in false choices. they want you to believe that you must choose between a facebook full of divisive and extreme content or losing free speech. that you must choose between public oversight of facebook's choices and your personal privacy. that to be able to share fun photos of your kids with old friends, you must also be inundated with anger-driven virality. they want you to believe this is just part of the deal. i am here today to tell you that's not true. these problems are solvable. a safer, free-speech-respecting, more enjoyable social media is possible, but there's one thing that i hope everyone takes away from these disclosures.
7:37 am
it is that facebook can change, but it's clearly not going to do so on its own. my fear is that without action, the divisive and extremist behaviors we see today are only the beginning. what we saw in myanmar and are now seeing in ethiopia are only the opening chapters of a story so terrifying no one wants to read the end of it. congress can change the rules that facebook plays by and stop the many harms it is now causing. we now know the truth about facebook's destructive impact. i really appreciate the seriousness with which the members of congress and the securities and exchange commission are approaching these issues. i came forward at great personal risk because i believe we still have time to act, but we must act now. i'm asking you, our elected representatives, to act. thank you. >> thank you for taking that personal risk and we will do
7:38 am
anything and everything to protect you and stop any retaliation against you by the company, or anyone else. i think we've made that clear in the course of these proceedings. i want to ask you about this idea of disclosure. you talked about a car going down the road. we're going to have questions, maybe a second round if you're willing to do it. we're here today to look under the hood. that's what we need to do more of. in august, senator blackburn and i wrote to mark zuckerberg and asked him pretty straightforward questions about how the company works and safeguards children and teens on instagram. facebook dodged and side-tracked and
7:39 am
in effect misled us. so i'm going to ask you a few straightforward questions to break down some of what you said. if you can answer them yes or no, that would be great. has facebook's research ever found that its platforms can have a negative effect on children's and teens' mental health or well-being? >> many of facebook's internal research reports indicate that facebook has a serious negative harm on a significant portion of teenagers and children. >> and has facebook ever offered features that it knew had a negative effect on children's and teens' mental health? >> facebook knows its amplification algorithms can lead children from very innocuous
7:40 am
topics like recipes, to anorexia-promoting content over a very short period of time. >> and has facebook ever found, again, in its research, that kids show signs of addiction on instagram? >> facebook has studied a pattern they call problematic use, what we might call addiction. it has a very high bar for what it believes it is: you self-identify that you don't have control over your usage and that it's harming your health, your schoolwork or your physical health. 5 to 6% of 14-year-olds have the self-awareness to admit to both those questions. it is likely that far more than 5 to 6% of 14-year-olds are addicted to instagram. >> last thursday, my colleagues and i asked miss davis, who was
7:41 am
representing facebook, about how the decision would be made whether to pause permanently instagram for kids, and she said, quote, there's no one person who makes a decision like that. we think about that collaboratively. it's as though she couldn't mention mark zuckerberg's name. isn't he the one that will be making this decision, from your experience in the company? >> mark holds a very unique role in the tech industry in that he holds over 55% of all the voting shares for facebook. there are no similarly powerful companies that are as unilaterally controlled. and in the end, the buck stops with mark. there's no one currently holding him accountable but himself. >> and mark zuckerberg is in fact the algorithm designer in chief, correct? >> i received an mba from harvard and they emphasized to us that we are responsible for
7:42 am
the organizations that we build. mark has built an organization that is very metrics driven. it is intended to be flat. there's no unilateral responsibility. the metrics make the decision. unfortunately, that itself is a decision. and in the end, as the ceo and chairman of facebook, he is responsible for those decisions. >> the buck stops with him. >> the buck stops with him. >> and speaking of the buck stopping, you have said that facebook should declare moral bankruptcy. i agree. its actions and its failure to acknowledge its responsibility indicate moral bankruptcy. >> there is a cycle occurring inside the company where facebook has struggled for a long time to recruit and retain the number of employees it needs to tackle the large scope of projects that it has chosen to take on. facebook is stuck in a cycle
7:43 am
where it struggles to hire. that causes it to understaff projects, which causes scandals, which then makes it harder to hire. part of why facebook needs to come out and say we did something wrong, we made some choices we regret, is that the only way we can move forward and heal facebook is to first admit the truth. the way we'll have reconciliation and move forward is by first being honest and declaring moral bankruptcy. >> and acknowledging that facebook has caused and aggravated pain to make more money. and it has profited off spreading disinformation and misinformation and sowing hate. facebook's answers to facebook's destructive impact always seem to be more facebook. we need more facebook. which means more pain. and more money for facebook. would you agree? >> i don't think at any point
7:44 am
facebook set out to make a destructive platform. i think the challenge is that facebook has set up an organization where the parts responsible for growing and expanding the company are separate from and not regularly cross-pollinated with the parts that focus on the harms the company is causing. as a result, integrity actions, projects that were hard fought by the teams trying to keep users safe, are regularly undone by new growth projects that counteract those same remedies. so i think there are organizational problems that need oversight in order to move forward to a healthier place. >> whether it's teens bullied into suicidal thoughts or myanmar or fanning the flames of division in our own country or
7:45 am
europe, facebook is ultimately responsible for the immorality of the pain it has caused. >> facebook needs to take responsibility for the consequences of its choices. it needs to be willing to accept small tradeoffs on profit, and i think just the act of being able to admit that it's a mixed bag is important. and i think what we saw from antigone davis, instead of focusing on the good they do, they should admit they have a responsibility to always remedy the harm. >> but mark zuckerberg's new policy is no apologies, no admissions, no acknowledgment. nothing to see here. we're going to deflect it and go sailing. i turn to the ranking member. >> thank you, mr. chairman. thank you for your testimony. i want to stay with miss davis
7:46 am
and some of her comments, because i had asked her last week about the underage users. she had made the comment, and i'm going to quote, if we find an account of someone under 13, we remove them. in the last three months, we removed 600,000 accounts of under-13-year-olds, end quote. i have to tell you, it seems to me that there's a problem if you have 600,000 accounts from children who ought not to be there in the first place. so what did mark zuckerberg know about facebook's plans to bring kids on as new users and advertise to them? >> there are reports within facebook that show cohort analyses examining at what ages people join facebook and instagram. based on those, facebook
7:47 am
likes to say children lie about their ages to get onto the platform. the reality is that enough kids tell the truth that you can work backwards to figure out the approximate real ages of anyone on the platform. when facebook does cohort analysis and looks back, it finds that 10 to 15% of 10-year-olds in a given cohort may be on facebook or instagram. >> this is why the ceo of instagram would have replied the way he did to jojo siwa. she said, i've been on instagram since i was 8. he said, i didn't want to know that. it would be for this reason, correct? >> a pattern of behavior i saw at facebook was that problems were often so understaffed that there was kind of an implicit discouragement from having
7:48 am
better systems. my last team was on the counterespionage team, and at any given time, our team could only handle a third of the cases that we knew about. we knew that if we built even a basic detector, we would likely have many more cases. >> let me ask you this. they have the data, and they're choosing to keep it, advertise from it, and sell it to third parties. so what does facebook do? you've got these 600,000 accounts that ought not to be on there. >> probably more. >> right. but then you delete those accounts, but what happens to that data? does facebook keep that data? do they keep it until those children go to age 13?
7:49 am
since they can work backward and figure out the true age of a user. so what do they do with it? do they delete it? store it? do they keep it? how do they process that? >> my understanding of facebook's data retention policies, and i want to be clear, i didn't work directly on that, is that when they delete an account, they delete the data in compliance. with regard to children underage on the platform, facebook could detect more of those children, and they should have to publish those processes for congress, because those processes could be much more effective. >> got it. now, staying with these underage children, since this hearing is all about kids and about online privacy, i want you to tell me how facebook is able to do
7:50 am
market research on these children that are under age 13, because miss davis really didn't deny this last week. so how are they doing this? do they bring kids into focus groups with their parents? how do they get that permission? she said they got permission from parents. is there a permission slip or a form that gets signed, and then how do they know which kids to target? >> there's a bunch to unpack there. we'll start with how they recruit children or teenagers for focus groups. most of the groups i read about were
7:51 am
around messenger kids. those appear to be children interacting in person. often, companies use sourcing agencies that will go and identify people who meet certain demographic criteria, or reach out directly based on data on the platform. so for example, in the case of messenger kids, maybe you'd want to study a child who was an active user and one who was a less active user. >> so these are children under age 13. >> yeah. >> and they know it. >> for some of these studies. i assume they get permission, but i don't work on it. >> well, we're still waiting to get a copy of that parental consent form that would involve children. my time has expired. i'll save my other questions for a second round if we're able to get those. >> thank you. senator klobuchar. >> thank you very much, mr. chairman. thank you so much, miss haugen, for shedding light on how
7:52 am
facebook time and time again has put profit first. when they saw it made teen girls' thoughts of suicide worse, they made instagram kids. when they found out their algorithms were fostering polarization, misinformation and hate, they allowed 99% of that content to remain unchecked on their platform, including in the lead-up to the january 6th insurrection. and what did they do? now mark zuckerberg is going sailing. the time has come for action, and i think you are the catalyst. you have said privacy legislation is not enough. i completely agree with you, but you know, we have not done anything to update our privacy laws in this country. our federal privacy laws.
7:53 am
nothing. zilch. in any major way. why? because there are lobbyists around every corner of this building who have been hired by the tech industry. we have done nothing when it comes to making the algorithms more transparent, allowing for university research. why? because facebook and other tech companies are throwing a bunch of money around this town and people are listening to them. we have passed nothing significant, although we are working on a bipartisan basis to get something done on consolidation, which, as you understand, allows the dominant platforms to control all this, like the bullies in the neighborhood, buying out the companies that maybe could have competed with them and adding the bells and whistles. so the time for action is now. so i'll start with something that i asked facebook's head of safety when she testified last week. i asked her how they estimate the lifetime value of a user for kids who start using their
7:54 am
products before they turn 13. she evaded the question and said that's not the way we think about it. is that right, or is it your experience that facebook estimates and puts a value on how much money they get from users in general? i'll get to kids in a second. is that a motivating force for them? >> based on what i saw of the allocation of integrity spending, as reported in "the wall street journal," about 87% is spent on english content, but only about 9% of users are english speakers. it seems that facebook invests more in users who make them more money, even though the danger may not be evenly distributed based on profitability. >> does it make sense that having a younger person get hooked on social media at a young age makes them more profitable over the long term, as they have a life ahead of them? >> facebook's internal documents talk about the importance of getting younger users, for
7:55 am
example, tweens, onto instagram, like instagram kids, because they know children bring their parents online. they understand the value of younger users for the long-term success of facebook. >> they reported revenue of $58.50 per user. when i asked how much of that came from users under 18, she wouldn't say. do you think that teens are profitable for their company? >> i would assume so. advertisers pay substantially higher rates to reach customers who don't yet have preferences or habits. so i'm sure they're some of the more profitable users, but i don't work directly on that. >> another major issue that's come out of this: eating disorders. studies found that they have the highest mortality rate of any mental illness for women. and i led a bill on this with
7:56 am
senators kavanagh and baldwin that we passed into law, and i'm concerned that the algorithms they have push outrageous content promoting anorexia and the like. i know it's personal to you. do you think their algorithms push this content to young girls? >> facebook knows that engagement-based ranking, the way they pick the content in instagram for young users, for all users, amplifies preferences. they've done proactive incident response where they take things they heard, for example, can you be led by the algorithms to anorexia content, and they have literally recreated this experiment and confirmed, yes, this happens to people. so facebook knows that they are leading young users to anorexia content. >> do you think they are deliberately designing their product to be addictive, beyond even the content? >> facebook has a long history of having a successful and
7:57 am
effective growth division where they take little tiny tweaks and constantly try to optimize for growth. those kinds of stickiness can be construed as things that promote addiction. >> last thing. as we've seen this same kind of content in the political world, you brought up other countries and what's been happening there. on 60 minutes, you said facebook implemented safeguards to reduce misinformation ahead of the 2020 election, but turned off those safeguards right after the election. and you know that the insurrection occurred on january 6th. do you think they turned it off because it was reducing profits? >> facebook has been emphasizing a false choice. they said the safeguards in place before the election implicated free speech. the choices happening on the
7:58 am
platform were about how reactive and how twitchy the platform was. how viral. facebook changed those safety defaults in the run-up to the election because they knew they were dangerous, and because they wanted that growth back, they wanted the acceleration of the platform back, after the election they returned to their original defaults. and the fact that they had to break the glass on january 6th and turn them back on, i think that's deeply problematic. >> agreed. thank you very much for your bravery in coming forward. >> senator thune. >> thank you, mr. chair, and ranking member blackburn. i've been arguing for some time that it is time for congress to act, and i think the question is always what is the correct way to do it, the right way to do it, consistent with our first amendment right to free speech. i'm not averse to looking at the monopolistic position of facebook. i think that's a real issue that
7:59 am
needs to be examined and perhaps addressed. at least in this committee, there are a couple of things i think we can do. i have a piece of legislation, and senators blackburn and blumenthal are both co-sponsors, called the filter bubble transparency act. essentially it would give users the option to engage with social media platforms without being manipulated by the secret formulas that essentially dictate the content that you see when you open up an app or log on to a website. we also, i think, need to hold big tech accountable by reforming section 230, and one of the best opportunities to do that, at least in a bipartisan way, is the platform accountability and consumer transparency act, the pact act. that's legislation i've co-sponsored with senator schatz. among other things, it would require platforms to remove content that a court determines to be illegal, and it could increase
8:00 am
transparency around the content moderation process. importantly, the act would explore the viability of a federal program for big tech employees to blow the whistle on wrongdoing inside the companies where they work. we should encourage employees inside the tech sector, like you, to speak up about questionable practices of big tech companies, so we can, among other things, ensure americans know how these companies are using artificial intelligence and opaque algorithms to keep them hooked on the platform. we learned from the information that you provided that facebook conducts what's called engagement-based ranking, which you've described as very dangerous. can you talk more about why it's dangerous, and do you think congress should seek to pass legislation like the filter bubble transparency act that would give users the ability to avoid
